Model parameters option to pass in model tuning, arbitrary parameters #430
Conversation
Not sure this config should be in the UI. It might be a better match for traitlets-based config.

@ellisonbg

I was only able to test the UI so far (I don't have Bedrock access yet); it looks and works well: the new fields are shown, and format verification on them works.
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* log configured model_parameters
* fix markdown formatting in docs
* fix single quotes and use preferred traitlets CLI syntax
@meeseeksdev please backport to 1.x

Owee, I'm MrMeeseeks, Look at me. There seems to be a conflict; please backport manually. Here are approximate instructions:

And apply the correct labels and milestones. Congratulations, you did some good work! Hopefully your backport PR will be tested by the continuous integration and merged soon! Remember to remove the … If these instructions are inaccurate, feel free to suggest an improvement.
…tuning, arbitrary parameters
…bitrary parameters (#453) Co-authored-by: Piyush Jain <[email protected]>
…jupyterlab#430)
* Endpoint args for SM endpoints
* Added model and endpoints kwargs options.
* Added configurable option for model parameters.
* Updated magics, added model_parameters, removed model_kwargs and endpoint_kwargs.
* Fixes %ai error for SM endpoints.
* Fixed docs
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* 430 fixes (jupyterlab#2)
* log configured model_parameters
* fix markdown formatting in docs
* fix single quotes and use preferred traitlets CLI syntax

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: david qiu <[email protected]>
Fixes #361
Fixes #444
Description
This PR brings `model_parameters` as an additional option for passing arbitrary values to the provider class during initialization. This option is available in both the Chat UI and Magics. It is useful for passing parameters, such as model tuning parameters, that affect the response generation by the model. It is also an appropriate place to pass in custom attributes required by certain providers/models.

The accepted value is a dictionary whose top-level keys are model IDs (`provider:model_id`) and whose values are arbitrary dictionaries that are unpacked and passed as-is to the provider class.
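For illustration, a minimal sketch of this lookup-and-unpack behavior; the dictionary contents and the `provider_kwargs` helper are hypothetical, not code from the PR:

```python
# Hypothetical model_parameters mapping: keys are "provider:model_id" and
# values are arbitrary kwargs unpacked into the provider class as-is.
# The specific parameter names and values below are illustrative assumptions.
model_parameters = {
    "bedrock:ai21.j2-mid-v1": {"model_kwargs": {"maxTokens": 1024}},
    "anthropic:claude-2": {"max_tokens": 1024, "temperature": 0.5},
}

def provider_kwargs(model_uid: str) -> dict:
    """Return the extra kwargs configured for this model, or {} if none."""
    return model_parameters.get(model_uid, {})

# These kwargs would be passed as-is to the provider class,
# e.g. SomeProvider(**provider_kwargs("bedrock:ai21.j2-mid-v1")).
print(provider_kwargs("anthropic:claude-2"))
```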
Configuring as a startup option
In this sample, the `bedrock` provider will be created with the value for `model_kwargs` when the `ai21.j2-mid-v1` model is selected. The above will result in the following LLM class being generated.
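The code blocks for this example were not preserved in the page capture. As a sketch, assuming jupyter-ai's traitlets configuration and LangChain's `Bedrock` class (the `maxTokens` value is an illustrative assumption):

```python
# In a Jupyter traitlets config file (sketch; value is assumed):
c.AiExtension.model_parameters = {
    "bedrock:ai21.j2-mid-v1": {"model_kwargs": {"maxTokens": 1024}}
}

# Roughly the resulting LLM instantiation:
Bedrock(model_id="ai21.j2-mid-v1", model_kwargs={"maxTokens": 1024})
```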
Here is another example, where the `anthropic` provider will be created with the values for `max_tokens` and `temperature` when the `claude-2` model is selected. The above will result in the following LLM class being generated.
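As a sketch of this configuration and the class it would produce, assuming LangChain's `Anthropic` class (the class name, signature, and values here are assumptions, not from the PR):

```python
# In a Jupyter traitlets config file (sketch; values are assumed):
c.AiExtension.model_parameters = {
    "anthropic:claude-2": {"max_tokens": 1024, "temperature": 0.5}
}

# Roughly the resulting LLM instantiation:
Anthropic(model="claude-2", max_tokens=1024, temperature=0.5)
```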
Configuring via a config file
This configuration can also be specified in a config file in JSON format. The config can be loaded by specifying the path to the config file.
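As a sketch, such a JSON config file might look like the following; the structure mirrors Jupyter's JSON config convention (top-level class name, then trait names), and the parameter values are illustrative assumptions:

```json
{
  "AiExtension": {
    "model_parameters": {
      "bedrock:ai21.j2-mid-v1": {
        "model_kwargs": { "maxTokens": 1024 }
      }
    }
  }
}
```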
Here is an example of configuring the `bedrock` provider for the `ai21.j2-mid-v1` model.
Magic command samples
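The magic command samples did not survive the page capture. As a hypothetical sketch, assuming the `%%ai` magic accepts a `--model-parameters` option with a JSON string (the flag name and format are assumptions, not confirmed from this PR):

```
%%ai bedrock:ai21.j2-mid-v1 --model-parameters '{"model_kwargs": {"maxTokens": 1024}}'
Write a short poem about Jupyter.
```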